AI news about 10-million-token context windows
| Time | Details |
|---|---|
| 2025-09-17 03:00 | Google's ATLAS Language Model Sets New Benchmark with Trainable Memory Module for 10 Million-Token Inputs. According to DeepLearning.AI, Google researchers have unveiled ATLAS, a language model architecture that replaces traditional attention mechanisms with a trainable memory module, enabling the processing of inputs up to 10 million tokens (source: DeepLearning.AI). The 1.3 billion-parameter model was trained on the FineWeb dataset, with only the memory module being updated during inference, significantly improving efficiency. ATLAS achieved an 80 percent score on the BABILong benchmark for long-context understanding and averaged 57.62 percent across eight QA benchmarks, outperforming competing models such as Titans and Transformer++ (source: DeepLearning.AI). This opens up new business opportunities for AI applications requiring long-context reasoning, such as legal document analysis, enterprise search, and large-scale data summarization. |
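The core idea reported above, a memory module that is the only component updated while the model reads its input, can be illustrated with a toy sketch. Everything below is an illustrative assumption, not the published ATLAS architecture: the memory is a plain linear map `M`, "memorizing" is one gradient step on a reconstruction loss over key/value pairs, and the dimensions and learning rate are arbitrary.

```python
import numpy as np

# Minimal sketch of a trainable memory module updated at inference time
# ("test-time memorization"), in the spirit of what DeepLearning.AI
# describes for ATLAS. The linear memory, the squared-error loss, and the
# learning rate are illustrative assumptions, not the published design.

rng = np.random.default_rng(0)
d = 16       # feature dimension (assumption)
lr = 0.01    # inner-loop learning rate for the memory update (assumption)

M = np.zeros((d, d))  # the memory module: recall(key) = M @ key

def memorize(M, key, value, lr=lr):
    # One gradient step on ||M @ key - value||^2, updating only M.
    err = M @ key - value
    return M - lr * np.outer(err, key)

def recall(M, key):
    return M @ key

# A stream of (key, value) pairs stands in for the long token context.
# The fixed target map W plays the role of frozen model weights; only
# the memory M changes while sweeping over the context.
W = rng.standard_normal((d, d))
keys = rng.standard_normal((300, d))
values = keys @ W.T  # associations the memory should absorb

for k, v in zip(keys, values):
    M = memorize(M, k, v)
```

Note that the memory's size is fixed (`d x d`) no matter how many pairs stream through it, which is one intuition for why such a module can scale to very long inputs where full attention, whose cost grows with context length, cannot.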